Maximum A Posteriori inference in graphical models is often solved via message-passing algorithms, such as the junction-tree algorithm or loopy belief-propagation. The exact solution to this problem is well known to be exponential in the size of the model's maximal cliques after it is triangulated, while approximate inference is typically exponential in the size of the model's factors. In this paper, we take advantage of the fact that many models have maximal cliques that are larger than their constituent factors, and also of the fact that many factors consist entirely of latent variables (i.e., they do not depend on an observation). This is a common case in a wide variety of applications, including grids, trees, and ring-structured models. In such cases, we are able to decrease the exponent of complexity for message-passing by 0.5 for both exact and approximate inference.
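To make the complexity claim concrete, the following is a minimal sketch (not the paper's accelerated method) of standard max-product message passing on a chain MRF with N variables of S states each: every message requires maximizing over an S x S table, giving the baseline O(N * S^2) cost whose exponent the paper aims to reduce. All function and variable names here are illustrative.

```python
import numpy as np

def max_product_chain(unary, pairwise):
    """MAP inference by max-product message passing on a chain MRF.

    unary: list of N arrays of shape (S,) holding log-potentials.
    pairwise: list of N-1 arrays of shape (S, S) holding log-potentials
              between consecutive variables.
    Each forward message maximizes over an (S, S) table, so the chain
    is solved in O(N * S^2) time -- exponential (here, quadratic) in
    the factor size, matching the baseline cost discussed above.
    """
    N = len(unary)
    msgs = [np.zeros_like(unary[0])]  # message into variable 0 is uniform
    back = []                         # argmax pointers for decoding
    for i in range(1, N):
        # scores[a, b] = best log-score of state a at i-1 paired with b at i
        scores = (unary[i - 1] + msgs[-1])[:, None] + pairwise[i - 1]
        back.append(scores.argmax(axis=0))
        msgs.append(scores.max(axis=0))
    # Decode the MAP assignment by backtracking the argmax pointers.
    states = [int((unary[-1] + msgs[-1]).argmax())]
    for ptr in reversed(back):
        states.append(int(ptr[states[-1]]))
    return states[::-1]

# Tiny example: 3 binary variables whose pairwise factors prefer agreement.
unary = [np.array([0.0, 1.0]), np.array([0.0, 0.0]), np.array([0.0, 0.0])]
agree = np.array([[2.0, 0.0], [0.0, 2.0]])
print(max_product_chain(unary, [agree, agree]))  # -> [1, 1, 1]
```

On tree-structured models the same two-pass scheme applies; on loopy graphs the messages are iterated (loopy belief propagation) or the graph is triangulated into a junction tree, where the maximization runs over cliques rather than factors.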